MuraNet: Multi-task Floor Plan Recognition with Relation Attention

Authors

Abstract

The recognition of information in floor plan data requires the use of both detection and segmentation models. However, relying on several single-task models can result in ineffective utilization of relevant features when multiple tasks are present simultaneously. To address this challenge, we introduce MuraNet, an attention-based multi-task model for segmentation and detection in floor plan data. In MuraNet, we adopt a unified attention-based encoder called MURA as the backbone, with two separated branches: an enhanced segmentation decoder branch and a decoupled detection head branch based on YOLOX, for the segmentation and detection tasks respectively. The architecture of MuraNet is designed to leverage the fact that walls, doors, and windows usually constitute the primary structure of a floor plan's architecture. By jointly training the model on both tasks, we believe MuraNet can effectively extract and utilize features relevant to both tasks. Our experiments on the CubiCasa5k public dataset show that MuraNet improves convergence speed during training compared to single-task models like U-Net and YOLOv3. Moreover, we observe improvements in the average AP and IoU for the detection and segmentation tasks, respectively. Our ablation studies demonstrate that the attention-based unified backbone achieves better feature extraction, and that multi-head branches for the different tasks further improve performance. We believe our proposed model can address the disadvantages of single-task models and improve the accuracy and efficiency of floor plan recognition.
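
The shared-backbone, two-branch layout described in the abstract can be illustrated with a small PyTorch sketch. This is only a structural illustration under assumed layer sizes: the class names (SharedEncoder, SegDecoder, DetHead, MultiTaskFloorPlanNet) and all layers are placeholders, not the authors' MURA backbone, segmentation decoder, or YOLOX-based detection head.

# Minimal sketch of a shared encoder feeding a segmentation branch and a
# decoupled detection branch. All names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    """Stand-in for the unified attention-based backbone."""
    def __init__(self, in_ch=3, feat_ch=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Single self-attention layer as a placeholder for relation attention.
        self.attn = nn.MultiheadAttention(feat_ch, num_heads=4, batch_first=True)

    def forward(self, x):
        f = self.conv(x)                          # (B, C, H/4, W/4)
        b, c, h, w = f.shape
        tokens = f.flatten(2).transpose(1, 2)     # (B, H*W, C)
        tokens, _ = self.attn(tokens, tokens, tokens)
        return tokens.transpose(1, 2).reshape(b, c, h, w)

class SegDecoder(nn.Module):
    """Upsampling branch producing per-pixel class logits (e.g. wall masks)."""
    def __init__(self, feat_ch=64, num_classes=3):
        super().__init__()
        self.up = nn.Sequential(
            nn.ConvTranspose2d(feat_ch, feat_ch, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(feat_ch, num_classes, 2, stride=2),
        )

    def forward(self, f):
        return self.up(f)

class DetHead(nn.Module):
    """Decoupled detection head: separate classification and box-regression convs."""
    def __init__(self, feat_ch=64, num_classes=2):
        super().__init__()
        self.cls = nn.Conv2d(feat_ch, num_classes, 1)  # e.g. door / window scores
        self.reg = nn.Conv2d(feat_ch, 4, 1)            # box offsets per location

    def forward(self, f):
        return self.cls(f), self.reg(f)

class MultiTaskFloorPlanNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = SharedEncoder()
        self.seg_branch = SegDecoder()
        self.det_branch = DetHead()

    def forward(self, x):
        f = self.encoder(x)                       # shared features for both tasks
        return self.seg_branch(f), self.det_branch(f)

if __name__ == "__main__":
    model = MultiTaskFloorPlanNet()
    img = torch.randn(1, 3, 256, 256)
    seg_logits, (cls_logits, box_reg) = model(img)
    print(seg_logits.shape, cls_logits.shape, box_reg.shape)
    # Joint training would sum a segmentation loss and a detection loss here.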


Similar Resources

Neural Relation Extraction with Multi-lingual Attention

Relation extraction has been widely used for finding unknown relational facts from the plain text. Most existing methods focus on exploiting mono-lingual data for relation extraction, ignoring massive information from the texts in various languages. To address this issue, we introduce a multi-lingual neural relation extraction framework, which employs monolingual attention to utilize the inform...


Attention-Based LSTM with Multi-Task Learning for Distant Speech Recognition

Distant speech recognition is a highly challenging task due to background noise, reverberation, and speech overlap. Recently, there has been an increasing focus on attention mechanism. In this paper, we explore the attention mechanism embedded within the long short-term memory (LSTM) based acoustic model for large vocabulary distant speech recognition, trained using speech recorded from a singl...


Multi-Agent Plan Recognition: Formalization and Algorithms

Multi-Agent Plan Recognition (MAPR) seeks to identify the dynamic team structures and team behaviors from the observations of the activity-sequences of a set of intelligent agents, based on a library of known team-activities (plan library). It has important applications in analyzing data from automated monitoring, surveillance, and intelligence analysis in general. In this paper, we formalize M...


Action-Model Based Multi-agent Plan Recognition

Multi-Agent Plan Recognition (MAPR) aims to recognize dynamic team structures and team behaviors from the observed team traces (activity sequences) of a set of intelligent agents. Previous MAPR approaches required a library of team activity sequences (team plans) be given as input. However, collecting a library of team plans to ensure adequate coverage is often difficult and costly. In this pap...


Motor Performance in Relation with Sustained Attention in Children with Attention Deficit Hyperactivity Disorder

Objective: The present study compares the relationship between motor performance, sustained attention, and impulse control in children with Attention Deficit Hyperactivity Disorder and normal children. Materials & Methods: In this descriptive-analytic study, 21 boys with ADHD and 21 normal boys in the age range of 7-10 years participated. Motor performance by using Bruininks Oseretsky Test ...


Journal

Journal: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-41498-5_10